18 research outputs found

    Spontaneous Blinks Activate the Precuneus: Characterizing Blink-Related Oscillations Using Magnetoencephalography

    Get PDF
    Spontaneous blinking occurs 15–20 times per minute. Although blinking is often associated with its physiological role in corneal lubrication, there is now increasing behavioral evidence that blinks are also modulated by cognitive processes such as attention and information processing. Recent low-density electroencephalography (EEG) studies have reported so-called blink-related oscillations (BROs) associated with spontaneous blinking at rest. Delta-band (0.5–4 Hz) BROs are thought to originate from the precuneus, a region involved in environmental monitoring and awareness, with potential clinical utility in the evaluation of disorders of consciousness. However, the neural mechanisms of BROs have not been elucidated. Using magnetoencephalography (MEG), we characterized delta-band BROs in 36 healthy individuals while controlling for background brain activity. Results showed that, compared to the pre-blink baseline, delta-band BROs produced increased global field power (p < 0.001) and time-frequency spectral power (p < 0.05) at the sensor level, peaking at ~250 ms after the blink maximum. Source localization showed that spontaneous blinks activated the bilateral precuneus (p < 0.05 FWE), and source activity within the precuneus was consistent with the sensor-space results. Crucially, these effects were observed only in the blink condition and were absent in the control condition, demonstrating that the results were due to spontaneous blinks rather than inherent background brain activity. The current study represents the first MEG examination of BROs. Our findings suggest that spontaneous blinks activate the precuneus in a manner consistent with environmental monitoring and awareness, and provide important neuroimaging support for the cognitive role of spontaneous blinks.
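    The abstract does not give implementation details, so the following is a minimal sketch of a blink-locked delta-band analysis of the kind described above, assuming blink maxima have already been detected; the sampling rate, epoch window, and filter settings are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # Hz; illustrative sampling rate, not from the paper


def bandpass_delta(data, fs=FS, lo=0.5, hi=4.0):
    """Zero-phase band-pass filter into the delta band (0.5-4 Hz)."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)


def blink_locked_gfp(meg, blink_samples, fs=FS, pre=0.5, post=1.0):
    """Epoch continuous MEG data around blink maxima and return the
    global field power (std across sensors) of the blink-locked average.

    meg           : (n_channels, n_samples) continuous recording
    blink_samples : sample indices of detected blink maxima
    """
    delta = bandpass_delta(meg, fs)
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for s in blink_samples:
        if s - n_pre < 0 or s + n_post > delta.shape[1]:
            continue  # skip blinks too close to the recording edges
        epoch = delta[:, s - n_pre:s + n_post]
        # Baseline-correct each channel against the pre-blink interval.
        epoch = epoch - epoch[:, :n_pre].mean(axis=1, keepdims=True)
        epochs.append(epoch)
    evoked = np.mean(epochs, axis=0)  # blink-locked average (channels x time)
    return evoked.std(axis=0)         # GFP over time; study reports a peak ~250 ms
```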

    Multimodal Characterization of the Semantic N400 Response within a Rapid Evaluation Brain Vital Sign Framework

    Get PDF
    Background: For nearly four decades, the N400 has been an important brainwave marker of semantic processing. It can be recorded non-invasively from the scalp using electrical and/or magnetic sensors, but largely within the restricted domain of research laboratories specialized to run specific N400 experiments. However, there is increasing evidence of significant clinical utility for the N400 in neurological evaluation, particularly at the individual level. To enable clinical applications, we recently reported a rapid evaluation framework known as “brain vital signs” that successfully incorporated the N400 response as one of its core components for cognitive function evaluation. The current study characterized the rapidly evoked N400 response to demonstrate that it shares consistent features with traditional N400 responses acquired in research laboratory settings—thereby enabling its translation into brain vital signs applications. Methods: Data were collected from 17 healthy individuals using magnetoencephalography (MEG) and electroencephalography (EEG), with analysis of sensor-level effects as well as evaluation of brain sources. Individual-level N400 responses were classified using machine learning to determine the percentage of participants in whom the response was successfully detected. Results: The N400 response was observed in both M/EEG modalities, showing significant differences between the incongruent and congruent conditions in the expected time range (p < 0.05). Also as expected, N400-related brain activity was observed in temporal and inferior frontal cortical regions, with typical left-hemispheric asymmetry. Classification robustly confirmed the N400 effect at the individual level with high accuracy (89%), sensitivity (0.88), and specificity (0.90). Conclusion: The brain vital sign N400 characteristics were highly consistent with features of previously reported N400 responses acquired using traditional laboratory-based experiments. These results provide important evidence supporting clinical translation of the rapidly acquired N400 response as a potential tool for assessing higher cognitive functions.
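    The abstract reports machine-learning classification of individual-level N400 responses but does not specify the classifier or features; the sketch below is one plausible stand-in, using the mean amplitude in a canonical N400 window (350–550 ms, an assumption) with a cross-validated logistic regression.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

FS = 500   # Hz; illustrative
T0 = -0.1  # s; assumed epoch start relative to word onset


def n400_features(epochs, fs=FS, t0=T0, window=(0.35, 0.55)):
    """Mean amplitude per trial and channel in an assumed N400 window.

    epochs : (n_trials, n_channels, n_samples) array of word-locked epochs.
    """
    i0, i1 = int((window[0] - t0) * fs), int((window[1] - t0) * fs)
    return epochs[:, :, i0:i1].mean(axis=2)


def n400_detection_accuracy(congruent, incongruent):
    """Cross-validated accuracy for separating congruent from incongruent
    trials; above-chance accuracy indicates a detectable N400 effect."""
    X = np.vstack([n400_features(congruent), n400_features(incongruent)])
    y = np.r_[np.zeros(len(congruent)), np.ones(len(incongruent))]
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, y, cv=5).mean()
```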

    Brain Vital Signs: Expanding From the Auditory to Visual Modality

    Get PDF
    The critical need for rapid, objective, physiological evaluation of brain function at point-of-care has led to the emergence of brain vital signs—a framework encompassing portable electroencephalography (EEG) and an automated, rapid test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., aging, seniors, dementia). Consequently, it has become important to translate brain vital signs into the visual sensory modality. Therefore, the objectives of this study were to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Visual brain vital signs were elicited successfully for all three responses across the group (N100: F = 29.8380, p < 0.001; P300: F = 138.8442, p < 0.0001; N400: F = 6.8476, p = 0.01). Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes (rho = 0.7, p = 0.0001) across individuals. Auditory P300 latencies were shorter than visual latencies (p < 0.0001), but normalization and correlation (r = 0.5, p = 0.0033) implied a potential systematic difference across modalities. Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (r = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading and listening language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.
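    The normalization and comparison procedures are not detailed in the abstract; a minimal sketch of the kind of paired group-level test and across-participant correlation reported above (e.g., for P300 peak amplitude) might look as follows, with the statistical choices being illustrative assumptions.

```python
from scipy.stats import spearmanr, ttest_rel


def compare_modalities(auditory, visual):
    """Paired group-level test plus across-participant correlation for one
    ERP measure (e.g., P300 peak amplitude), one value per participant.

    auditory, visual : sequences in the same participant order.
    """
    t, p_diff = ttest_rel(auditory, visual)    # group-level modality difference
    rho, p_corr = spearmanr(auditory, visual)  # individual-level consistency
    return {"t": t, "p_diff": p_diff, "rho": rho, "p_corr": p_corr}
```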

    Developing Brain Vital Signs: Initial Framework for Monitoring Brain Function Changes over Time

    Get PDF
    Clinical assessment of brain function relies heavily on indirect, behavior-based tests. Unfortunately, behavior-based assessments are subjective and therefore susceptible to several confounding factors. Event-related brain potentials (ERPs), derived from electroencephalography (EEG), are often used to provide objective, physiological measures of brain function. Historically, ERPs have been characterized extensively within research settings, with limited but growing clinical applications. Over the past 20 years, we have developed clinical ERP applications for the evaluation of functional status following serious injury and/or disease. This work has identified an important gap: the need for a clinically accessible framework to evaluate ERP measures. Crucially, this enables baseline measures before brain dysfunction occurs, and might enable the routine collection of brain function metrics in the future, much like blood pressure measures today. Here, we propose such a framework for extracting specific ERPs as potential “brain vital signs.” This framework enabled the translation/transformation of complex ERP data into accessible metrics of brain function for wider clinical utilization. To formalize the framework, three essential ERPs were selected as initial indicators: (1) the auditory N100 (auditory sensation); (2) the auditory oddball P300 (basic attention); and (3) the auditory speech-processing N400 (cognitive processing). First-step validation was conducted on healthy younger and older adults (age range: 22–82 years). Results confirmed specific ERPs at the individual level (86.81–98.96%), verified predictable age-related differences (P300 latency delays in older adults, p < 0.05), and demonstrated successful linear transformation into the proposed brain vital sign (BVS) framework (the basic attention latency sub-component of the BVS framework reflects delays in older adults, p < 0.05). The findings represent an initial critical step in developing, extracting, and characterizing ERPs as vital signs, critical for subsequent evaluation of dysfunction in conditions like concussion and/or dementia.
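    The exact linear transformation into brain vital sign scores is not given in the abstract; the sketch below assumes a simple min-max linear mapping of each ERP measure onto a 0–100 scale against hypothetical normative bounds (the NORMS values are invented for illustration only).

```python
import numpy as np

# Hypothetical normative bounds per ERP measure: (low, high, higher_is_better).
# These values are invented for illustration and are not from the study.
NORMS = {
    "N100_amplitude": (1.0, 10.0, True),     # uV
    "N100_latency":   (80.0, 150.0, False),  # ms; shorter is better
    "P300_amplitude": (2.0, 20.0, True),
    "P300_latency":   (250.0, 450.0, False),
    "N400_amplitude": (1.0, 12.0, True),
    "N400_latency":   (350.0, 600.0, False),
}


def to_bvs_score(measure, value):
    """Linearly map a raw ERP measure onto a 0-100 'brain vital sign'
    scale, clipping to the assumed normative range."""
    lo, hi, higher_better = NORMS[measure]
    frac = np.clip((value - lo) / (hi - lo), 0.0, 1.0)
    return 100.0 * frac if higher_better else 100.0 * (1.0 - frac)
```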

    Brain vital signs: Towards next generation neurotechnologies for rapid brain function assessments at point-of-care

    Get PDF
    Vital signs such as heart rate, blood pressure, and body temperature have revolutionized medical care by providing rapidly assessed, physiology-based, non-invasive, and easy-to-understand standardized metrics of different body functions. However, no such vital sign exists for the brain; instead, assessments of the brain are largely reliant on surrogate measures such as observations of behaviour or questionnaire-based measurements, which have been shown to be subjective and unreliable. This research aims to fill this key scientific, clinical, and technological gap by developing a brainwave-based technology platform to evaluate ‘vital sign’ metrics for the brain. A series of studies were undertaken to create and demonstrate a ‘brain vital signs’ platform capable of assessing a broad spectrum of functions, ranging from lower-level functions (i.e., sensation) to the highest-level cognitive domains (i.e., contextual orientation). In particular, the first study focused on the development and initial demonstration of the methods and apparatus for the brain vital signs technology; the next study focused on characterizing the brain vital sign responses to ensure scientific validity; the third study focused on creating a previously non-existent, neurophysiology-based neural marker capable of capturing contextual orientation, which is the highest-level cognitive domain known to be crucial to frontline clinical assessments; and the last study focused on developing an advanced data-analytic technique for maximizing signal capture in the noisy environments typical of point-of-care evaluation settings. This research represents the first time that a ‘vital sign’-like metric has been developed for the brain that embodies the key characteristics of existing vital signs, enabling brain function measures that are rapid (~5-minute testing time), easy to use, portable, non-invasive, and standardized with automated analysis. Crucially, these vital sign metrics directly measure the brain’s electrical activity and do not depend on any responses from the test participant, thus providing much more objective information about brain function. The development of portable and objective ‘vital sign’-like metrics for the brain not only advances the scientific understanding of brain function through novel metrics like orientation, but also creates significant opportunities for enhancing clinical diagnosis through improved brain function assessments at the point-of-care.

    Blink-Related Oscillations Provide Naturalistic Assessments of Brain Function and Cognitive Workload within Complex Real-World Multitasking Environments

    No full text
    Background: There is a significant need to monitor human cognitive performance in complex environments, one example being pilot performance. However, existing assessments largely focus on subjective experiences (e.g., questionnaires) and the evaluation of behavior (e.g., aircraft handling) as surrogates for cognition, or utilize brainwave measures that require artificial setups (e.g., simultaneous auditory stimuli) that intrude on the primary tasks. Blink-related oscillations (BROs) are a recently discovered neural phenomenon associated with spontaneous blinking that can be captured without artificial setups and are also modulated by cognitive loading and the external sensory environment—making them ideal for brain function assessment within complex operational settings. Methods: Electroencephalography (EEG) data were recorded from eight adult participants (five female, mean age 21.1 years) while they completed the Multi-Attribute Task Battery under three different cognitive loading conditions. BRO responses in the time and frequency domains were derived from the EEG data, and comparisons of BRO responses across cognitive loading conditions were undertaken. Assessments of blink behavior were also undertaken simultaneously. Results: Blink behavior assessments revealed a decreasing blink rate with increasing cognitive load, and BRO responses in both the time and frequency domains were significantly modulated by cognitive loading (p < 0.05). Conclusion: This study confirms the ability of BRO responses to capture cognitive loading effects as well as preparatory pre-blink cognitive processes in anticipation of the upcoming blink during a complex multitasking situation. These successful results suggest that blink-related neural processing could be a potential avenue for cognitive state evaluation in operational settings—both specialized environments such as cockpits, space exploration, and military units, and everyday situations such as driving, athletics, and human-machine interactions—where human cognition needs to be seamlessly monitored and optimized.
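    The blink-detection procedure is not described in the abstract; a minimal sketch of blink-rate estimation from a frontal EEG/EOG channel, using an illustrative amplitude-threshold heuristic, might look as follows. Computing the rate per loading condition and participant would then support a repeated-measures comparison of the kind reported above.

```python
from scipy.signal import find_peaks

FS = 250  # Hz; illustrative sampling rate


def blink_rate(eog, fs=FS, min_separation=0.3):
    """Blinks per minute from a frontal EEG/EOG channel, detecting blinks
    as large positive deflections (the threshold is a simple heuristic)."""
    threshold = eog.mean() + 3.0 * eog.std()
    peaks, _ = find_peaks(eog, height=threshold,
                          distance=int(min_separation * fs))
    minutes = len(eog) / fs / 60.0
    return len(peaks) / minutes
```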

    Improved localization accuracy in magnetic source imaging using a 3-D laser scanner

    No full text
    Brain source localization accuracy in magnetoencephalography (MEG) requires accuracy both in digitizing anatomical landmarks and in coregistering to anatomical magnetic resonance images (MRI). We compared the source localization accuracy and MEG-MRI coregistration accuracy of two head digitization systems—a laser scanner and the current standard electromagnetic digitization system (Polhemus)—using a calibrated phantom and human data. When compared using the calibrated phantom, surface and source localization accuracy for data acquired with the laser scanner improved over the Polhemus by 141% and 132%, respectively. Laser scan digitization reduced MEG source localization error by 1.38 mm on average. In human participants, a laser scan of the face generated 1000-fold more points per unit time than the Polhemus head digitization. An automated surface-matching algorithm improved the accuracy of MEG-MRI coregistration over the equivalent manual procedure. Simulations showed that the laser scan coverage could be reduced to an area around the eyes only while maintaining coregistration accuracy, suggesting that acquisition time can be substantially reduced. Our results show that the laser scanner can both reduce setup time and improve localization accuracy in comparison to the Polhemus digitization system.
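    The paper's automated surface-matching algorithm is not detailed in the abstract; a standard approach to rigid MEG-MRI surface alignment is iterative closest point (ICP), whose core step is the Kabsch/SVD alignment of corresponding point sets, sketched below. A full ICP would re-estimate correspondences (e.g., by nearest-neighbor search) and repeat this step until convergence.

```python
import numpy as np


def kabsch(P, Q):
    """Least-squares rigid transform (R, t) aligning point set P onto Q,
    the core step of ICP-style matching between a laser head scan and
    the MRI-derived scalp surface.

    P, Q : (n_points, 3) arrays of corresponding coordinates.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)       # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t  # apply as: P_aligned = P @ R.T + t
```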

    Towards brain first-aid : a diagnostic device for conscious awareness

    No full text
    When the brain is damaged, evaluating an individual's level of awareness can be a major diagnostic challenge (Is he or she in there?). Existing tests typically rely on behavioral indicators, which are incorrect in as many as one out of every two cases. The current paper presents a diagnostic device that addresses this problem. The technology circumvents behavioral limitations through non-invasive brainwave measurements (electroencephalography, or EEG). Unlike traditional EEG, the device is designed for point-of-care use, incorporating a portable, user-friendly, and stable design. It uses a novel software algorithm that automates subject stimulation, data acquisition/analysis, and the reporting of results. The test provides indicators for five identifiable levels of neural processing: sensation, perception, attention, memory, and language. The results are provided as rapidly obtained diagnostic, reliability, validity, and prognostic scores. The device can be applied to a wide variety of patients across a host of different environments. The technology is designed to be wireless-enabled for remote monitoring and assessment capabilities. In essence, the device is developed to scan for conscious awareness in order to optimize subsequent patient care.
